A Divergence Formula for Randomness and Dimension

Author

  • Jack H. Lutz
Abstract

If S is an infinite sequence over a finite alphabet Σ and β is a probability measure on Σ, then the dimension of S with respect to β, written dim^β(S), is a constructive version of Billingsley dimension that coincides with the (constructive Hausdorff) dimension dim(S) when β is the uniform probability measure. This paper shows that dim^β(S) and its dual Dim^β(S), the strong dimension of S with respect to β, can be used in conjunction with randomness to measure the similarity of two probability measures α and β on Σ. Specifically, we prove that the divergence formula dim^β(R) = Dim^β(R) = H(α) / (H(α) + D(α||β)) holds whenever α and β are computable, positive probability measures on Σ and R ∈ Σ^∞ is random with respect to α. In this formula, H(α) is the Shannon entropy of α, and D(α||β) is the Kullback-Leibler divergence between α and β. We also show that the above formula holds for all sequences R that are α-normal (in the sense of Borel) when dim^β(R) and Dim^β(R) are replaced by the more effective finite-state dimensions dim_FS^β(R) and Dim_FS^β(R). In the course of proving this, we also prove finite-state compression characterizations of dim_FS^β(S) and Dim_FS^β(S).
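
For orientation (this example is not part of the paper): the right-hand side of the divergence formula depends only on α and β and can be evaluated directly. The following Python sketch, in which the binary alphabet and the measures alpha and beta are illustrative assumptions, computes H(α)/(H(α) + D(α||β)) numerically.

    import math

    def shannon_entropy(p):
        """Shannon entropy H(p), in bits, of a probability measure on a finite alphabet."""
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)

    def kl_divergence(p, q):
        """Kullback-Leibler divergence D(p||q), in bits; assumes q is positive."""
        return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    # Hypothetical measures on the binary alphabet {0, 1} (illustrative values only).
    alpha = [0.5, 0.5]   # the measure with respect to which R is random
    beta = [0.9, 0.1]    # the measure defining the Billingsley-type dimension

    h = shannon_entropy(alpha)
    d = kl_divergence(alpha, beta)

    # Value predicted by the divergence formula for dim^β(R) = Dim^β(R).
    print(h / (h + d))   # ≈ 0.576 for these illustrative measures

The value is 1 exactly when D(α||β) = 0, that is, when α = β, and it decreases toward 0 as the divergence grows, which is how the formula measures the similarity of α and β.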

Similar articles

A Divergence Formula for Randomness and Dimension (Short Version)

If S is an infinite sequence over a finite alphabet Σ and β is a probability measure on Σ, then the dimension of S with respect to β, written dim^β(S), is a constructive version of Billingsley dimension that coincides with the (constructive Hausdorff) dimension dim(S) when β is the uniform probability measure. This paper shows that dim^β(S) and its dual Dim^β(S), the strong dimension of S with...

Mutual Dimension and Random Sequences

If S and T are infinite sequences over a finite alphabet, then the lower and upper mutual dimensions mdim(S : T) and Mdim(S : T) are the lower and upper densities of the algorithmic information that is shared by S and T. In this paper we investigate the relationships between mutual dimension and coupled randomness, which is the algorithmic randomness of two sequences R1 and R2 with respect t...

Particle swarm optimization using dimension selection methods

Particle swarm optimization (PSO) has undergone many changes since its introduction in 1995. As a stochastic algorithm, PSO and its randomness present a formidable challenge for theoretical analysis, and few of the existing PSO improvements have made an effort to eliminate the random coefficients in the PSO updating formula. This paper analyzes the importance of the randomness in the...
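
As context only: the "random coefficients in the PSO updating formula" are the uniform random multipliers in the classical velocity update. The Python sketch below shows that generic update; the parameter values w, c1, and c2 are hypothetical defaults, not taken from the paper above.

    import random

    def pso_velocity_update(v, x, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
        """Classical PSO velocity update for one particle.

        r1 and r2 are the random coefficients referred to above;
        w, c1, and c2 are hypothetical inertia/acceleration constants.
        """
        new_v = []
        for vi, xi, pi, gi in zip(v, x, pbest, gbest):
            r1, r2 = random.random(), random.random()  # uniform draws on [0, 1]
            new_v.append(w * vi + c1 * r1 * (pi - xi) + c2 * r2 * (gi - xi))
        return new_v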

Penalized Bregman Divergence Estimation via Coordinate Descent

Variable selection via penalized estimation is appealing for dimension reduction. For penalized linear regression, Efron et al. (2004) introduced the LARS algorithm. More recently, the coordinate descent (CD) algorithm was developed by Friedman et al. (2007) for penalized linear regression and penalized logistic regression and was shown to be computationally superior. This paper explores...
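
For background only (this is not the penalized Bregman divergence estimator studied in that paper): cyclic coordinate descent for an L1-penalized least-squares objective updates one coefficient at a time by soft-thresholding. The Python sketch below is a minimal illustration; the data shapes, the penalty lam, and the iteration count are assumptions.

    import numpy as np

    def soft_threshold(z, lam):
        """Soft-thresholding operator: the one-dimensional minimizer under an L1 penalty."""
        return np.sign(z) * max(abs(z) - lam, 0.0)

    def lasso_coordinate_descent(X, y, lam, n_iter=100):
        """Cyclic coordinate descent for (1/(2n)) * ||y - X b||^2 + lam * ||b||_1."""
        n, p = X.shape
        b = np.zeros(p)
        for _ in range(n_iter):
            for j in range(p):
                # Partial residual with feature j removed from the current fit.
                r_j = y - X @ b + X[:, j] * b[j]
                z = (X[:, j] @ r_j) / n
                # Closed-form coordinate update: soft-threshold, then rescale.
                b[j] = soft_threshold(z, lam) / ((X[:, j] @ X[:, j]) / n)
        return b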

Bounds for the dimension of the $c$-nilpotent multiplier of a pair of Lie algebras

In this paper, we study the Neumann boundary value problem of a class of nonlinear divergence-type diffusion equations. By a priori estimates, difference and variation techniques, we establish the existence and uniqueness of weak solutions of this problem.


Publication year: 2008